A New Distribution on the Simplex with Auto-Encoding Applications
We construct a new distribution for the simplex using the Kumaraswamy distribution and an ordered stick-breaking process. We explore and develop the theoretical properties of this new distribution and prove that it exhibits symmetry (exchangeability) under the same conditions as the well-known Dirichlet. Like the Dirichlet, the new distribution is adept at capturing sparsity but, unlike the Dirichlet, has an exact and closed-form reparameterization, making it well suited for deep variational Bayesian modeling. We demonstrate the distribution's utility in a variety of semi-supervised auto-encoding tasks. In all cases, the resulting models achieve competitive performance commensurate with their simplicity, use of explicit probability models, and abstinence from adversarial training.
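To make the "exact and closed-form reparameterization" concrete, here is a minimal NumPy sketch (not the authors' code) of the two ingredients the abstract names: a Kumaraswamy sample drawn by inverse-CDF transform of uniform noise, and a stick-breaking map that turns the resulting stick fractions into a point on the simplex. The function names and the fixed stick ordering are illustrative assumptions; the paper's mixing over orderings is omitted.

```python
import numpy as np

def kumaraswamy_sample(a, b, rng):
    # Exact reparameterization: X = (1 - (1 - U)^(1/b))^(1/a) with U ~ Uniform(0, 1),
    # so gradients w.r.t. the parameters (a, b) flow through a closed-form transform.
    u = rng.uniform(size=np.shape(a))
    return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

def stick_breaking_simplex(a, b, rng):
    # a, b: shape-(K-1,) Kumaraswamy parameters, one pair per stick fraction.
    v = kumaraswamy_sample(a, b, rng)                      # fractions in (0, 1)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - v)])  # stick left before each break
    pi = np.concatenate([v, [1.0]]) * remaining            # last entry absorbs the remainder
    return pi                                              # nonnegative, sums to 1

rng = np.random.default_rng(0)
pi = stick_breaking_simplex(np.array([1.5, 2.0, 0.8]), np.array([1.0, 1.2, 2.5]), rng)
```

Because every step is a differentiable elementary function of uniform noise, this construction supports the pathwise (reparameterization) gradient estimator directly, which is the property the abstract contrasts with the Dirichlet.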
Reviews: A New Distribution on the Simplex with Auto-Encoding Applications
Originality: Although VAEs using a stick-breaking construction with Kumaraswamy distributions have been considered before (Nalisnick and Smyth, "Stick-Breaking Variational Autoencoders", 2017), the idea of using such a construction and extending it by mixing over the orderings to obtain a density more similar to a Dirichlet is new and interesting. Related work is adequately cited.

Quality: The paper seems technically sound and its claims are largely supported. Although Theorem 1 is a standard result, reiterating it is likely useful for the subsequent exposition. Experimental results show that the method outperforms some baselines; however, I feel that some additional experiments would be useful (see details below in Section 5, Improvements).
Stirn, Andrew, Jebara, Tony, Knowles, David